Regularization Methods for Multi-Output Learning

Authors

  • Lorenzo Rosasco
  • Yi-Chieh Wu
  • Phillip Isola
Abstract

Suppose we are attempting to model the buying preferences of several consumers based on their past purchases, as in the Netflix recommender system. We assume that people with similar tastes tend to buy similar items, so their buying histories are related. Inferring the preferences of a customer from their own purchases alone may be difficult, because that customer may not have rated enough movies or made enough purchases. If we can add information from other customers with similar tastes, we effectively increase the number of data samples and can hope to increase prediction accuracy, prompting customers to rent movies attuned to their tastes and leaving them more satisfied with the overall Netflix service. In this case, each consumer is modeled as a task, and their previous preferences form the corresponding training set.
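The setup described in the abstract, one task per consumer with a small per-task training set, can be sketched with a mean-regularized multi-task ridge regression. This is a standard coupling penalty used for illustration, not necessarily the exact formulation in the paper, and all data and names below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the Netflix example: T consumers (tasks), each with
# only a few ratings of d-dimensional item features. Tasks share a common
# preference vector plus a small task-specific deviation ("similar tastes").
T, d, n_per_task = 5, 8, 10
w_common = rng.normal(size=d)
tasks = []
for _ in range(T):
    w_t = w_common + 0.1 * rng.normal(size=d)
    X = rng.normal(size=(n_per_task, d))
    y = X @ w_t + 0.05 * rng.normal(size=n_per_task)
    tasks.append((X, y))

def fit_multitask_ridge(tasks, lam=0.1, gamma=1.0, n_sweeps=50):
    """Mean-regularized multi-task ridge (an illustrative coupling penalty):

        sum_t ||X_t w_t - y_t||^2 + lam * sum_t ||w_t||^2
                                  + gamma * sum_t ||w_t - w_bar||^2

    Solved by a simple fixed-point iteration: hold the task mean w_bar
    fixed, solve each task's ridge system in closed form, then update
    w_bar and repeat.
    """
    n_tasks, dim = len(tasks), tasks[0][0].shape[1]
    W = np.zeros((n_tasks, dim))
    I = np.eye(dim)
    for _ in range(n_sweeps):
        w_bar = W.mean(axis=0)
        for t, (X, y) in enumerate(tasks):
            W[t] = np.linalg.solve(X.T @ X + (lam + gamma) * I,
                                   X.T @ y + gamma * w_bar)
    return W

W = fit_multitask_ridge(tasks)
```

Increasing `gamma` pulls the per-task weights toward their common mean, which is how information is shared across consumers; setting `gamma = 0` recovers independent per-task ridge regressions.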


Similar articles

Learning output kernels for multi-task problems

Simultaneously solving multiple related learning tasks is beneficial under a variety of circumstances, but the prior knowledge necessary to correctly model task relationships is rarely available in practice. In this paper, we develop a novel kernel-based multi-task learning technique that automatically reveals structural inter-task relationships. Building over the framework of output kernel lea...


Learning with Limited Supervision by Input and Output Coding

In many real-world applications of supervised learning, only a limited number of labeled examples are available because the cost of obtaining high-quality examples is high or the prediction task is very specific. Even with a relatively large number of labeled examples, the learning problem may still suffer from limited supervision as the dimensionality of the input space or the complexity of th...


Squared-loss Mutual Information Regularization: A Novel Information-theoretic Approach to Semi-supervised Learning

We propose squared-loss mutual information regularization (SMIR) for multi-class probabilistic classification, following the information maximization principle. SMIR is convex under mild conditions and thus improves the nonconvexity of mutual information regularization. It offers all of the following four abilities to semi-supervised algorithms: Analytical solution, out-of-sample/multi-class cl...


A unifying framework for vector-valued manifold regularization and multi-view learning

This paper presents a general vector-valued reproducing kernel Hilbert spaces (RKHS) formulation for the problem of learning an unknown functional dependency between a structured input space and a structured output space, in the Semi-Supervised Learning setting. Our formulation includes as special cases Vector-valued Manifold Regularization and Multi-view Learning, thus provides in particular a...


A Unifying Framework in Vector-valued Reproducing Kernel Hilbert Spaces for Manifold Regularization and Co-Regularized Multi-view Learning

This paper presents a general vector-valued reproducing kernel Hilbert spaces (RKHS) framework for the problem of learning an unknown functional dependency between a structured input space and a structured output space. Our formulation encompasses both Vector-valued Manifold Regularization and Co-regularized Multi-view Learning, providing in particular a unifying framework linking these two imp...


Tree-Guided Group Lasso for Multi-Task Regression with Structured Sparsity

We consider the problem of learning a sparse multi-task regression, where the structure in the outputs can be represented as a tree with leaf nodes as outputs and internal nodes as clusters of the outputs at multiple granularity. Our goal is to recover the common set of relevant inputs for each output cluster. Assuming that the tree structure is available as prior knowledge, we formulate this p...



Publication date: 2010